# Wikipedia-Pretrained Models
## Roberta Base Indonesian 522M

- Author: cahya
- License: MIT
- Tags: Large Language Model, Other

An Indonesian pretrained model based on the RoBERTa-base architecture, trained on Indonesian Wikipedia data; the model is case-insensitive.
## Distilbert Base En Fr Es De Zh Cased

- Author: Geotrend
- License: Apache-2.0
- Tags: Large Language Model, Transformers, Supports Multiple Languages

A lightweight version of distilbert-base-multilingual-cased that supports five languages (English, French, Spanish, German, and Chinese) while preserving the original model's accuracy.
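The entries above name checkpoints that are published on the Hugging Face hub under their authors' namespaces. A minimal sketch of loading one of them with the `transformers` library, assuming the hub id `Geotrend/distilbert-base-en-fr-es-de-zh-cased` matches this listing (the first call downloads the weights, so network access is required):

```python
# Hedged sketch: loading a listed checkpoint with Hugging Face `transformers`.
# The hub id is inferred from the listing and may differ from the actual repo.
from transformers import AutoModelForMaskedLM, AutoTokenizer

MODEL_ID = "Geotrend/distilbert-base-en-fr-es-de-zh-cased"

def load_checkpoint(model_id: str = MODEL_ID):
    """Return a (tokenizer, model) pair suitable for fill-mask inference."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForMaskedLM.from_pretrained(model_id)
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_checkpoint()
    print(model.config.model_type)
```

The same pattern applies to the Indonesian RoBERTa entry by substituting its hub id; `AutoTokenizer`/`AutoModelForMaskedLM` resolve the correct architecture from the checkpoint's config.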